Using Sampling and Simplex Derivatives in Pattern Search Methods

Authors

  • A. L. Custódio
  • Luís N. Vicente
Abstract

Pattern search methods can be made more efficient if past function evaluations are appropriately reused. In this paper we introduce a number of ways of reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients), to improve the efficiency of a pattern search iteration. At each iteration of a pattern search method, one can attempt to compute an accurate simplex gradient by identifying a sampling set of previous iterates with good geometrical properties. This simplex gradient computation can be done using only past successful iterates or by considering all past function evaluations. The simplex gradient can then be used, for instance, to reorder the evaluations of the objective function associated with the positive spanning set or positive basis used in the poll step. It can also be used to update the mesh size parameter according to a sufficient decrease criterion. None of these modifications demands new function evaluations. A search step can also be tried along the negative simplex gradient at the beginning of the current pattern search iteration. We present these procedures in detail and show how promising they are for enhancing the practical performance of pattern search methods.
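
To make these mechanics concrete, here is a minimal Python sketch (our illustration, not the authors' implementation; all function and parameter names are hypothetical) of the three ingredients described above: a least-squares simplex gradient built from previously evaluated points, a reordering of the poll directions by the descent indicator it provides, and one possible sufficient-decrease rule for the mesh size parameter.

    import numpy as np

    def simplex_gradient(y0, f0, Y, fY):
        # Least-squares simplex gradient at the base point y0:
        # solve min_g ||S g - delta||_2, where the rows of S are the
        # displacements y_i - y0 and delta_i = f(y_i) - f(y0).
        S = np.asarray(Y, dtype=float) - np.asarray(y0, dtype=float)
        delta = np.asarray(fY, dtype=float) - f0
        g, *_ = np.linalg.lstsq(S, delta, rcond=None)
        return g

    def reorder_poll(D, g):
        # Sort the poll directions (rows of D, a positive spanning set)
        # by the descent indicator g . d, most negative first, so the
        # most promising direction is evaluated first. Reordering costs
        # no new function evaluations.
        return D[np.argsort(D @ g)]

    def update_mesh(alpha, achieved_decrease, required_decrease,
                    tau=0.5, phi=2.0):
        # One plausible sufficient-decrease rule (an assumption, not the
        # paper's exact criterion): expand the mesh size parameter when
        # the achieved decrease is sufficient, contract it otherwise.
        return phi * alpha if achieved_decrease >= required_decrease else tau * alpha

A search step along the negative simplex gradient then amounts to evaluating the objective at a trial point of the form x - alpha * g (projected onto the mesh) before polling, adding at most one function evaluation per iteration.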

Similar Resources

Using Sampling and Simplex Derivatives in Pattern Search Methods (Complete Numerical Results)

In this paper, we introduce a number of ways of making pattern search more efficient by reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients). At each iteration, one can attempt to compute an accurate simplex gradient by identifying a sampling set of previously evaluated points with good geometrical properties. This can...

Augmented Downhill Simplex: A Modified Heuristic Optimization Method

The Augmented Downhill Simplex Method (ADSM), introduced here, is a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm. DSM is an interpretable nonlinear local optimization method; however, as a local exploitation algorithm, it can become trapped in a local minimum. Random search, in contrast, provides global exploration but is less efficient. Here, rand...
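
As a rough illustration of this exploration/exploitation split, the following sketch (a generic random-restart scheme of our own, not the paper's ADSM; all names are hypothetical) drives a downhill-simplex local solver from uniformly sampled starting points in a box:

    import numpy as np
    from scipy.optimize import minimize

    def random_restart_simplex(f, bounds, n_starts=20, seed=0):
        # bounds: sequence of (low, high) pairs, one per variable.
        # Uniform random sampling supplies the global exploration;
        # Nelder-Mead downhill simplex supplies the local exploitation.
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        best = None
        for _ in range(n_starts):
            x0 = rng.uniform(lo, hi)                      # explore
            res = minimize(f, x0, method="Nelder-Mead")   # exploit
            if best is None or res.fun < best.fun:
                best = res
        return best

For example, random_restart_simplex(lambda x: (x[0]**2 - 1)**2 + x[1]**2, [(-2, 2), (-2, 2)]) locates one of the two global minimizers at (1, 0) and (-1, 0), whichever basin the better random start falls into.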

A View of Algorithms for Optimization Without Derivatives (DAMTP 2007/NA03)

Let the least value of the function F(x), x ∈ R^n, be required, where n ≥ 2. If the gradient ∇F is available, then one can tell whether search directions are downhill, and first-order conditions help to identify the solution. It seems in practice, however, that the vast majority of unconstrained calculations do not employ any derivatives. A view of this situation is given, attention being restric...

Using Simplex Gradients of Nonsmooth Functions in Direct Search Methods

It has been shown recently that the efficiency of direct search methods that use opportunistic polling in positive spanning directions can be improved significantly by reordering the poll directions according to descent indicators built from simplex gradients. The purpose of this paper is twofold. First, we analyze the properties of simplex gradients of nonsmooth functions in the context of dir...

From Evolutionary Operation to Parallel Direct Search: Pattern Search Algorithms for Numerical Optimization

... variables may bedevil pattern search methods if there is a high degree of nonlinearity. On the other hand, pattern search methods have been applied successfully to problems with as many as 256 variables. ... convergence of the Nelder-Mead simplex method to a non-stationary point ... direct search methods for unconstrained optimization on either sequential or parallel machines ... (Figure 5: a selection of admissible patterns) ...

Journal:
  • SIAM Journal on Optimization

Volume 18, Issue -

Pages -

Publication date: 2007